A neural representation of sequential states within an instructed task.

Authors

  • Michael Campos
  • Boris Breznen
  • Richard A Andersen
Abstract

In the study of the neural basis of sensorimotor transformations, it has become clear that the brain does not always wait to sense external events before selecting the appropriate responses. If there are predictable regularities in the environment, the brain begins to anticipate the timing of instructional cues and of the signals to execute a response, revealing an internal representation of the sequential behavioral states of the task being performed. To investigate neural mechanisms that could represent the sequential states of a task, we recorded neural activity from two oculomotor structures implicated in behavioral timing, the supplementary eye fields (SEF) and the lateral intraparietal area (LIP), while rhesus monkeys performed a memory-guided saccade task. The neurons of the SEF were found to collectively encode the progression of the task, with individual neurons predicting and/or detecting states or transitions between states. LIP neurons, while also encoding information about the current temporal interval, were limited relative to SEF neurons in two ways. First, LIP neurons tended to be active when the monkey was planning a saccade but not in the precue or intertrial intervals, whereas SEF neurons tended to show activity modulation in all intervals. Second, LIP neurons were more likely to be spatially tuned than SEF neurons. SEF neurons also showed anticipatory activity. The state-selective and anticipatory responses of SEF neurons support two complementary models of behavioral timing, state-dependent and accumulator models, and suggest that each model describes a contribution SEF makes to timing at a different temporal resolution.
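The accumulator account mentioned in the abstract treats elapsed time as the integral of a pacemaker-like drift signal that is read out when it reaches a threshold, so that a faster drift yields a shorter estimated interval. The sketch below illustrates that general idea only; the `drift`, `threshold`, and noise parameters are illustrative assumptions, not values or methods from the study.

```python
import random

def accumulator_crossing_time(drift, threshold, noise_sd=0.05, dt=0.001, rng=None):
    """Integrate a noisy drift signal until it first reaches threshold.

    Returns the elapsed time (in seconds) at the threshold crossing.
    On average the crossing time is threshold / drift, the core prediction
    of pacemaker-accumulator models of interval timing.
    """
    rng = rng or random.Random(0)
    x, t = 0.0, 0.0
    while x < threshold:
        # Euler step: deterministic drift plus diffusion-scaled noise.
        x += drift * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
        t += dt
    return t

# A faster drift (higher pacemaker rate) reaches threshold sooner,
# so the same threshold readout can time different intervals.
short_estimate = accumulator_crossing_time(4.0, 1.0)
long_estimate = accumulator_crossing_time(2.0, 1.0)
```

In this framing, the anticipatory ramping activity reported for SEF neurons would correspond to the rising accumulator variable `x`, while state-selective responses correspond to the discrete readout at threshold crossings.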


Similar articles

Sequential State Representations

Title: A Neural Representation of Sequential States Within an Instructed Task. Abbreviated title: Sequential State Representations. Authors: Michael Campos, Boris Breznen, and Richard A. Andersen, Computation and Neural Systems, Division of Biology, Caltech. Corresponding author: Michael Campos, MC 216-76 Caltech, Pasadena, CA 91125. Corresponding author email: mcampos@c...

Full text

Local Learning Algorithms for Sequential Tasks in Neural Networks

In this paper we explore the concept of sequential learning and the efficacy of global and local neural network learning algorithms on a sequential learning task. Pseudorehearsal (a method developed by Robins [19] to solve the catastrophic forgetting problem which arises from the excessive plasticity of neural networks) is significantly more effective than other local learning algorithms for th...

Full text

Representation of interval timing by temporally scalable firing patterns in rat prefrontal cortex.

Perception of time interval on the order of seconds is an essential component of cognition, but the underlying neural mechanism remains largely unknown. In rats trained to estimate time intervals, we found that many neurons in the medial prefrontal cortex (PFC) exhibited sustained spiking activity with diverse temporal profiles of firing-rate modulation during the time-estimation period. Intere...

Full text

Representation of immediate and final behavioral goals in the monkey prefrontal cortex during an instructed delay period.

We examined neuronal activity in the lateral prefrontal cortex of monkeys performing a path-planning task in a maze that required the planning of actions in multiple steps. The animals received an instruction that prompted them to prepare to move a cursor in the maze stepwise from a starting position to a goal position by operating manipulanda with either arm. During a delay period in which the...

Full text

Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion

In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One of the difficulties with the parameters of the network is that a representation of its neuron mathematical model is not possible. For this reason, a new representation of this network is suggested that solves this difficulty. In the representation, th...

Full text


Journal:
  • Journal of Neurophysiology

Volume 104, Issue 5

Pages: -

Publication date: 2010